Search Tree Pruning for Progressive Neural Architecture Search (Student Abstract)
Authors
Abstract
Similar Resources
Progressive Neural Architecture Search
We propose a method for learning CNN structures that is more efficient than previous approaches: instead of using reinforcement learning (RL) or genetic algorithms (GA), we use a sequential model-based optimization (SMBO) strategy, in which we search for architectures in order of increasing complexity, while simultaneously learning a surrogate function to guide the search, similar to A* search....
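The progressive strategy summarized above can be illustrated with a toy beam search, where the operation set, `evaluate`, and `surrogate` are hypothetical stand-ins rather than the paper's actual implementation (a minimal sketch, assuming a prefix-averaging surrogate):

```python
OPS = ["conv3", "conv5", "pool"]                        # hypothetical cell operations
TRUE_SCORE = {"conv3": 0.3, "conv5": 0.5, "pool": 0.1}  # toy per-op "accuracy"

def evaluate(arch):
    """Stand-in for training a child network: returns a toy accuracy."""
    return sum(TRUE_SCORE[op] for op in arch) / len(arch)

def surrogate(arch, history):
    """Cheap predictor: average measured score of previously evaluated
    architectures that are a prefix of this one; optimistic if unseen."""
    matches = [s for a, s in history if a == arch[:len(a)]]
    return sum(matches) / len(matches) if matches else 1.0

def progressive_search(max_blocks=3, beam_width=2):
    """Grow architectures one block at a time, pruning the search tree
    with the surrogate so only a shortlist is actually evaluated."""
    beam, history = [[]], []
    for _ in range(max_blocks):
        candidates = [a + [op] for a in beam for op in OPS]
        candidates.sort(key=lambda a: surrogate(a, history), reverse=True)
        shortlist = candidates[: beam_width * 2]        # surrogate-based pruning
        scored = sorted(((evaluate(a), a) for a in shortlist), reverse=True)
        history += [(a, s) for s, a in scored]
        beam = [a for _, a in scored[:beam_width]]
    return beam[0]
```

In this toy setting the search converges on the highest-scoring block at every depth while only training a shortlist of candidates per round, which is the cost saving SMBO-style search aims for.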
Tree Pruning for New Search Techniques in Computer Games
This paper proposes a new mechanism for pruning a search game-tree in computer chess. The algorithm stores and then reuses chains or sequences of moves, built up from previous searches. These move sequences have a built-in forward-pruning mechanism that can radically reduce the search space. A typical search process might retrieve a move from a Transposition Table, where the decision of what mo...
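The reuse idea behind such pruning can be sketched with a minimal transposition table over a toy additive game (the game and move set are hypothetical; only the caching mechanism is illustrated):

```python
def minimax(pos, depth, maximizing, table):
    """Plain minimax over a toy game where a move adds 1 or 2 to the
    position. The transposition table stores results of positions
    already searched to a given depth, so repeated subtrees are
    reused instead of re-expanded."""
    key = (pos, depth, maximizing)
    if key in table:
        return table[key]          # transposition hit: reuse prior search
    if depth == 0:
        result = pos               # toy evaluation: the position's value
    else:
        children = [minimax(pos + m, depth - 1, not maximizing, table)
                    for m in (1, 2)]   # hypothetical move set
        result = max(children) if maximizing else min(children)
    table[key] = result
    return result
```

Even in this tiny game, position 3 at depth 0 is reachable along two different move sequences, so the second visit is answered from the table rather than searched again.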
Pruning Techniques in Search and Planning (Research Abstract)
Search algorithms often suffer from exploring areas which eventually are not part of the shortest path from the start to a goal. Usually it is the purpose of the heuristic function to guide the search algorithm such that it will ignore as much as possible of these areas. We consider other, non-heuristic methods that can be used to prune the search space to make search even faster. We present tw...
Differentiable Neural Network Architecture Search
The successes of deep learning in recent years have been fueled by the development of innovative new neural network architectures. However, the design of a neural network architecture remains a difficult problem, requiring significant human expertise as well as computational resources. In this paper, we propose a method for transforming a discrete neural network architecture space into a continu...
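The continuous relaxation such methods rely on can be sketched as a softmax-weighted mixture over candidate operations, so the architecture weights become differentiable parameters (the operation names and functions here are hypothetical toys, not the paper's search space):

```python
import math

# Hypothetical operation set; each op is a simple function of the input.
OPS = {
    "identity": lambda x: x,
    "double":   lambda x: 2 * x,
    "negate":   lambda x: -x,
}

def softmax(logits):
    """Numerically stable softmax over a list of floats."""
    exps = [math.exp(l - max(logits)) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def mixed_op(x, alphas):
    """Continuous relaxation: the layer's output is a softmax-weighted
    mixture of all candidate operations, so the architecture weights
    `alphas` can be optimized by gradient descent with the model."""
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, OPS.values()))

def discretize(alphas):
    """After search, keep only the highest-weighted operation."""
    names = list(OPS)
    return names[max(range(len(alphas)), key=alphas.__getitem__)]
```

When one architecture weight dominates, the mixture output approaches that single operation's output, which is what makes the final discretization step reasonable.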
Exploring Neural Architecture Search for Language Tasks
Neural architecture search (NAS), the task of finding neural architectures automatically, has recently emerged as a promising approach for discovering better models than ones designed by humans alone. However, most success stories involve vision tasks, and results for text have been quite limited, except for small language modeling datasets. In this paper, we explore NAS for text sequences at scale, b...
Journal
عنوان ژورنال: Proceedings of the AAAI Conference on Artificial Intelligence
سال: 2020
ISSN: 2374-3468, 2159-5399
DOI: 10.1609/aaai.v34i10.7163